Problem Statement: Abandoned buildings and property are a common plight that urban centers face

According to the existing literature, abandoned buildings and land in a city are responsible for:

increasing crime rates (drug use, prostitution, etc.) 
increasing danger for public health and safety (collapsing parts of buildings, fires, etc.)
depressing nearby property values 
generating low property taxes; increasing costs for local governments (to secure, inspect, provide additional police and fire services, etc.) 

In other words, abandoned buildings and land in the city degrade the quality of life, creating an unattractive urban environment for citizens and visitors, as well as for future investors

Business Potential:

Real estate agencies can identify these buildings and move to acquire them.
Municipalities may want to transform them for other purposes.
Investors can get the upper hand when searching for vacant buildings to buy.
Businesses can easily assess the surrounding area for possible dangers when census data are scarce or outdated.

Anyone can locate an abandoned building just by taking a walk around, but naturally this is inefficient.

https://www.tandfonline.com/doi/abs/10.1080/01431161.2019.1615655?journalCode=tres20

https://www.mdpi.com/2072-4292/10/12/1920

https://www.researchgate.net/publication/350617115_Detecting_individual_abandoned_houses_from_google_street_view_A_hierarchical_deep_learning_approach

https://www.arcgis.com/home/item.html?id=d3da5dd386d140cf93fc9ecbf8da5e31


https://www.nerdwallet.com/article/small-business/business-location

https://journalistsresource.org/politics-and-government/abandoned-buildings-revitalization/

https://www.fortunebuilders.com/vacant-property/

Can freely available AND easily accessible satellite data detect abandoned buildings?

We are going to approach this topic from different angles:

1) Use Google Earth Engine, since it offers a great variety of data, has a relatively good Python API, and is accessible to anyone

2) Use Sentinel Hub, since it has a great Python API and there are existing libraries for a lot of tasks. However, it's not free.

Methodology

Data Collection:

Get satellite data from Sentinel-2 (10 m spatial resolution) and Landsat-8 (30 m spatial resolution)

Get satellite data from VIIRS (750 m spatial resolution, which is admittedly coarse), since absence of light may indicate abandonment

Possibly get Google satellite images (finer spatial resolution, but often outdated); these can be used to identify characteristics of an abandoned building

OSM GeoJSON for building footprints

Preprocessing:

Cloud Masks for all data

Use a neural network to improve the spatial resolution of Sentinel-2 images:
https://up42.com/blog/tech/sentinel-2-superresolution
https://github.com/lanha/DSen2
https://github.com/up42/DSen2

Possibly deblur VIIRS, or use it along with DMSP-OLS

Models:

Map NDVI and NDWI, since abandoned places tend to have low vegetation or water

Measure Mean Radiance of Lights over a particular area
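As a quick reference, NDVI itself is just a band ratio of near-infrared and red reflectance; a minimal NumPy sketch (the band values below are made up, and the epsilon guard is an addition of this sketch):

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    A small epsilon avoids division by zero over completely dark pixels."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + 1e-10)

# For Sentinel-2 surface reflectance, B8 is NIR and B4 is red
# (the reflectance values here are made up for illustration)
nir_band = np.array([[0.45, 0.30], [0.05, 0.50]])
red_band = np.array([[0.10, 0.25], [0.04, 0.10]])
print(ndvi(nir_band, red_band))
```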

NPP-VIIRS preprocessing (from housing_vacancy-npp.pdf): (1) The minimum value of NPP-VIIRS data should be 0, representing regions without light intensity. However, the values of a few pixels were lower than 0 due to imaging error; in our study, these negative values were reset to 0. (2) Some abruptly large pixel values also existed, which might be extraordinary noise or pixels associated with weak light reflected by high-reflectance surfaces (e.g. snow-capped mountains). To identify these pixels, the maximum radiance value of the city centre was first set as the upper threshold and then used to flag pixels with larger values. A max filter was applied to these abnormal pixels to fix their values. In this way, the background noise of the NPP-VIIRS NTL data was effectively eliminated.
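A sketch of these two cleaning steps on a NumPy radiance array (the function name `clean_ntl` and the `urban_max` parameter are assumptions of this sketch, not from the paper):

```python
import numpy as np

def clean_ntl(radiance: np.ndarray, urban_max: float) -> np.ndarray:
    """Clean an NPP-VIIRS nighttime-lights array following the two steps
    above: (1) reset negative pixels (imaging error) to 0; (2) replace
    abnormally bright pixels (above the city-centre maximum, `urban_max`)
    with the maximum of their 3x3 neighbourhood of valid pixels."""
    out = radiance.astype(float).copy()
    out[out < 0] = 0.0                 # step (1): negatives -> 0
    abnormal = out > urban_max         # step (2): flag outlier pixels
    valid = np.where(abnormal, 0.0, out)
    padded = np.pad(valid, 1, mode='edge')
    # 3x3 max filter built from the nine shifted views of the padded array
    local_max = np.max(
        [padded[i:i + out.shape[0], j:j + out.shape[1]]
         for i in range(3) for j in range(3)],
        axis=0)
    return np.where(abnormal, local_max, out)
```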

https://www.esri.com/about/newsroom/arcnews/start-up-fights-urban-blight/

https://github.com/d-smit/sentinel2-deep-learning

https://github.com/sentinel-hub/multi-temporal-super-resolution

https://machinelearningmastery.com/a-gentle-introduction-to-particle-swarm-optimization/

https://www.tandfonline.com/doi/abs/10.1080/01431161.2017.1331060

https://github.com/KlemenKozelj/sentinel2-earth-observation

https://code.earthengine.google.com/?scriptPath=Examples%3ADatasets%2FCOPERNICUS_S2_SR

https://code.earthengine.google.com/?scriptPath=Examples%3ADatasets%2FSKYSAT_GEN-A_PUBLIC_ORTHO_MULTISPECTRAL

Get Sentinel-2 Data through Google Earth Engine

We focus on Thessaly

The image is not clipped near coastal areas, which may be problematic for our analysis as we go further.

Get VIIRS Data through Google Earth Engine

I select data between 2015 and 2022 (the latest available through the engine), since I want to capture the whole COVID situation as well as changes possibly related to politics, etc.

# file for Greece
greece0 = ee.Feature(
    ee.FeatureCollection("FAO/GAUL/2015/level0")
    .filter(ee.Filter.eq('ADM0_NAME', 'Greece'))
    .first()
).geometry()

thessalia = ee.FeatureCollection("FAO/GAUL/2015/level2").filter(ee.Filter.eq('ADM1_NAME', 'Thessalia'))

print(f"There are {thessalia.size().getInfo()} level two admin units in Thessalia.")
thessalia.getInfo()

A region with lower levels of light may be affected by background light that the VIIRS instrument is sensitive to, which can influence interpretation.

As we look at this scene, you can see the relatively high levels of “noise” present.

As discussed earlier, one approach to increase the signal / noise ratio would be to reduce data over time.

But if the noise levels persist throughout the time period, that may not reduce the noise much. And what if your analysis is specifically to look at December 2017?

Or what if you’re looking to conduct comparative analysis on these data or use them as inputs for a model for statistical inference?

In this case, you will very likely want to reduce the noise levels in your data in order for your algorithm to learn your data without over-fitting (in other words, a more sensitive model might "learn" the noise, which is generally bad). Additionally, many loss functions are subject to "exploding" or "vanishing" gradients if your data are not centered near zero and scaled.
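The temporal-reduction idea is easy to demonstrate with synthetic data: reducing many noisy scenes of the same area with a median pulls pixel values back toward the underlying signal (the scene value, noise level, and scene count below are made up):

```python
import numpy as np

rng = np.random.default_rng(42)
true_radiance = np.full((16, 16), 4.0)   # assumed "clean" scene
# 24 noisy monthly observations of the same scene
monthly = true_radiance + rng.normal(0.0, 1.5, size=(24, 16, 16))

# error of a single noisy scene vs. error of the temporal median composite
single_scene_error = np.abs(monthly[0] - true_radiance).mean()
median_composite = np.median(monthly, axis=0)   # reduce over time
composite_error = np.abs(median_composite - true_radiance).mean()

print(f"single scene MAE:       {single_scene_error:.3f}")
print(f"24-month composite MAE: {composite_error:.3f}")
```

The same reduction in Earth Engine would be a `median()` (or `mean()`) over a filtered ImageCollection; the caveat from the text stands: if the noise persists across the whole period, reducing over time helps less.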

# polygon vertices are [lon, lat] (the first few pairs were originally flipped)
poi = ee.Geometry.Polygon([[[21.11034311414164, 39.65925368635945], [21.121745024122855, 39.62786608248593], [21.12322992837611, 39.60034000685686], [21.137953857024677, 39.58496058866589], [21.15488061550989, 39.580256262903916], [21.166081893846382, 39.56527812229155], [21.16055706829246, 39.538599346735936], [21.169011461923077, 39.51027513801262], [21.155781358741176, 39.498155244777976], [21.153034556881146, 39.48480473493292], [21.167700546861106, 39.46943863854366], [21.174852870112257, 39.452195322381776], [21.171553152754843, 39.43618716001877], [21.178718950237318, 39.41894827384953], [21.20235665616196, 39.40387871570699], [21.225994300925528, 39.38880920157895], [21.240258983703445, 39.35432255291319], [21.275209471620975, 39.3337972870931], [21.327398793001453, 39.32761248053104], [21.345921841660804, 39.330867660350165], [21.345239645649674, 39.34462841269293], [21.35150913959843, 39.35755092753244], [21.361265596364863, 39.37005871386407], [21.380355009240894, 39.37598482735699], [21.39831180086926, 39.376568988722624], [21.432009284114198, 39.36707552911248], [21.463998840482606, 39.34957357665826], [21.48271370534117, 39.351972598832454], [21.50454993836163, 39.37162830392784], [21.526386131613272, 39.39128401640999], [21.552302393079845, 39.4291640940549], [21.570865670478067, 39.43240141531394], [21.601695906461664, 39.40951728703932], [21.63006019969542, 39.425204379637655], [21.666602569137673, 39.42896343594438], [21.68632514659234, 39.43750703730765], [21.707751199729085, 39.45404588469823], [21.72518180764335, 39.45190991651593], [21.73276224290483, 39.43731082748466], [21.749015695688772, 39.429841851309035], [21.768154175528267, 39.435696671321], [21.777402306073682, 39.461911683720636], [21.79074842463069, 39.47395575603591], [21.807585985667654, 39.4855672084231], [21.83081344946508, 39.49364269733661], [21.8709007384975, 39.49688890551485], [21.887742770255013, 39.508495953001734], [21.924343072358383, 39.512165745502706], [21.955157671897908, 
39.52201820293759], [21.985972263102518, 39.53187054444512], [22.019094507602635, 39.519509912853344], [22.036511721511346, 39.51732050722246], [22.078986902962743, 39.53114819344666], [22.121462078170254, 39.544975817348536], [22.16449687341818, 39.544998152350395], [22.16694049895755, 39.57209160766255], [22.155971086520093, 39.587185720353965], [22.12226475056408, 39.596888728129265], [22.114840292925066, 39.62795527388407], [22.100334882309216, 39.6434773979652], [22.082890902607616, 39.64568021330115], [22.064242942582307, 39.64254544701414], [22.04855574641285, 39.65272108930522], [22.043985174581813, 39.680670780985714], [22.028864407802136, 39.693530796745065], [21.973495779671108, 39.68673512667089], [21.956586815127977, 39.675141484000854], [21.939686862605406, 39.67999299192468], [21.932137558888062, 39.694609928246024], [21.92403095050353, 39.73942833729418], [21.917123780914032, 39.82242555776728], [21.889102800529045, 39.8422997537275], [21.876256092029305, 39.83294462350949], [21.823687800363164, 39.839419205903106], [21.786318279925528, 39.83304048273634], [21.748948829700513, 39.82666167669445], [21.71331610770214, 39.82825804889596], [21.663358593285334, 39.83841592302598], [21.613401082428158, 39.84857376595426], [21.57642613284989, 39.85232498268147], [21.539451332907234, 39.85607620595456], [21.502476455018908, 39.85982741510876], [21.465501544543507, 39.863578611245856], [21.427010578610453, 39.85174416081156], [21.4066548217214, 39.840471567500366], [21.383462989304785, 39.815888547086985], [21.36367799107608, 39.807286951729054], [21.321744575162338, 39.7958403897652], [21.30366294520401, 39.79523847184679], [21.29130234101267, 39.80487897631912], [21.26384318023154, 39.810849727309105], [21.244615571241848, 39.80490132468022], [21.23480104596234, 39.792375711605345], [21.2150606616099, 39.78372947534038], [21.2047155354438, 39.76853734574027], [21.1984370514637, 39.75560592794437], [21.214775204257904, 39.69628654407656], 
[21.216224414597786, 39.66878279935999], [21.19816060165989, 39.66815402476963], [21.182344156264634, 39.67819598370166], [21.14330479486066, 39.68002416912158], [21.123586600269782, 39.6713869283674], [21.11034311414164, 39.65925368635945]]]).buffer(1600)
# Sklearn
from sklearn.linear_model import LinearRegression

x = magnisia_lights.index.values.reshape(-1, 1)
y = magnisia_lights['mean']

# instantiate and fit
lm_2 = LinearRegression()
lm_2.fit(x, y)

# print the coefficients
print('Intercept: ', lm_2.intercept_)
print('mean: ', lm_2.coef_[0])

# refit against ordinal dates instead of the index
import datetime as dt

magnisia_lights['date'] = pd.to_datetime(magnisia_lights['date'])
magnisia_lights['date'] = magnisia_lights['date'].map(dt.datetime.toordinal)
x = magnisia_lights['date'].values.reshape(-1, 1)
y = magnisia_lights['mean']

# instantiate and fit
lm_2 = LinearRegression()
lm_2.fit(x, y)
# lm_2.score(x, y)

# print the coefficients
print('Intercept: ', lm_2.intercept_)
print('mean: ', lm_2.coef_[0])
# get a time series of region means (GLDAS 3-hourly composites)
gldas2_1 = ee.ImageCollection('NASA/GLDAS/V021/NOAH/G025/T3H') \
    .filter(ee.Filter.date(startDate, endDate))
gldas2_1_ts = gldas2_1.getTimeSeriesByRegion(
    ee.Reducer.mean(), geometry=fc, scale=10, bestEffort=True,
    maxPixels=2e9, dateFormat='YYYYMMdd', tileScale=2)
gldas2_1_df = geemap.ee_to_pandas(gldas2_1_ts)
gldas2_1_df['date'] = pd.to_datetime(gldas2_1_df['date'], infer_datetime_format=True)

When the poi is the t.oik

This may be due to LED lights in the city centre during the Christmas period. We are using the mean, which can be affected by outliers, i.e. the LEDs are the outlier.

The drop in 2019 could be due to COVID-19

Visualization of NDVI over the poi

Time Series Analysis of NDVI over our poi

These changes are probably due to seasonal variation.

At this point we have obtained the NDVI and average radiance over our area of interest. Ideally we want to create a model that can assess the amount of damage on a specified building. Unfortunately, there are some limitations, mainly due to data availability.

Find Abandoned Buildings

https://www.researchgate.net/publication/268816675_Assessment_of_Concrete_Surfaces_Using_Multi-Spectral_Image_Analysis

https://www.researchgate.net/publication/236941973_Intelligent_Concrete_Health_Monitoring_ICHM_An_Innovative_Method_for_Monitoring_Concrete_Structures_using_Multi_Spectral_Analysis_and_Image_Processing

Methodology

We hypothesise that when there is low average radiance and higher mean NDVI, there is a probability that a building is abandoned: vacant buildings shouldn't have any lights on, and since they are unmaintained, grass and other vegetation should be present.

We set thresholds for these values: mean_rad < rad_thresh and mean_ndvi > ndvi_thresh
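A minimal sketch of this thresholding rule with pandas (the column names `mean_rad`/`mean_ndvi`, the label strings, and all values below are assumptions of this sketch):

```python
import pandas as pd

def label_abandoned(df: pd.DataFrame, rad_thresh: float, ndvi_thresh: float) -> pd.Series:
    """Flag a row as possibly abandoned when its mean radiance is below
    rad_thresh AND its mean NDVI is above ndvi_thresh."""
    mask = (df['mean_rad'] < rad_thresh) & (df['mean_ndvi'] > ndvi_thresh)
    return mask.map({True: 'possibly abandoned', False: 'in use'})

# made-up per-building statistics for illustration
buildings = pd.DataFrame({
    'mean_rad':  [0.8, 6.2, 1.1],
    'mean_ndvi': [0.45, 0.20, 0.10],
})
buildings['label'] = label_abandoned(buildings, rad_thresh=2.0, ndvi_thresh=0.3)
print(buildings)
```

Only the first row is dark *and* overgrown, so only it is flagged; the thresholds themselves still have to come from data (see the pseudo-classifier methodology below in the original notes).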

https://github.com/awesome-spectral-indices/awesome-spectral-indices

https://www.isprs.org/proceedings/XXXI/congress/part7/321_XXXI-part7.pdf #ui

https://www.tandfonline.com/doi/abs/10.1080/01431161.2019.1615655?journalCode=tres20 #viirs house vacancy

VNP09GA = (ee.ImageCollection("NOAA/VIIRS/001/VNP09GA")
           .filterDate("2020-01-01", "2021-01-01")
           .maskClouds()
           .scaleAndOffset())
ts3 = VNP09GA.getTimeSeriesByRegion(
    reducer=[ee.Reducer.mean()], geometry=fc, scale=10, bestEffort=True,
    maxPixels=1e13, dateFormat='YYYYMMdd', tileScale=2)
tsPandas3 = geemap.ee_to_pandas(ts3)
tsPandas3['date'] = pd.to_datetime(tsPandas3['date'], infer_datetime_format=True)
tsPandas3

for i in range(len(combined)):
    if (combined['EVI'].loc[i] > combined['EVI'].mean()
            and combined['NDVI'].loc[i] > combined['NDVI'].mean()
            and combined['avg_rad'].loc[i] < combined['avg_rad'].mean()):
        combined.loc[i, 'label'] = labels[0]
    else:
        combined.loc[i, 'label'] = labels[1]

for i in range(len(combined)):
    print(combined['EVI'].loc[i])
    if combined['EVI'].loc[i] > combined['EVI'].mean():
        print('true')

https://custom-scripts.sentinel-hub.com/custom-scripts/sentinel-2/ndvi/

NDVI < -0.2        #000000
-0.2 < NDVI ≤ 0    #a50026
0 < NDVI ≤ 0.1     #d73027
0.1 < NDVI ≤ 0.2   #f46d43
0.2 < NDVI ≤ 0.3   #fdae61
0.3 < NDVI ≤ 0.4   #fee08b
0.4 < NDVI ≤ 0.5   #ffffbf
0.5 < NDVI ≤ 0.6   #d9ef8b
0.6 < NDVI ≤ 0.7   #a6d96a
0.7 < NDVI ≤ 0.8   #66bd63
0.8 < NDVI ≤ 0.9   #1a9850
0.9 < NDVI ≤ 1.0   #006837
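The colour table above can be turned into a small lookup helper for styling our own NDVI plots; a sketch (the function name `ndvi_color` is an assumption):

```python
import bisect

# upper bin bounds and colours from the Sentinel Hub NDVI table above;
# values below -0.2 map to black
bounds = [-0.2, 0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]
colors = ['#000000', '#a50026', '#d73027', '#f46d43', '#fdae61', '#fee08b',
          '#ffffbf', '#d9ef8b', '#a6d96a', '#66bd63', '#1a9850', '#006837']

def ndvi_color(value: float) -> str:
    """Return the hex colour for an NDVI value; each bin is (lower, upper],
    so bisect_left picks the first bound >= value."""
    i = bisect.bisect_left(bounds, value)
    return colors[min(i, len(colors) - 1)]
```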

From here on down, we will make the dataframe change depending on what the user wants to see

Check this out

https://github.com/Toroitich6783/classification-using-GEE

https://github.com/afrozalopa/Project_Report_Group-1

https://geohackweek.github.io/GoogleEarthEngine/05-classify-imagery/

https://geemap.org/workshops/GeoPython_2021/#zonal-statistics

METHODOLOGY FOR PSEUDO-CLASSIFIER

1) Get values (e.g. NDVI, etc.) for different hand-picked points of interest that we already know to be abandoned or not.
2) Compute the mean of each value in every category; for example, compute the mean NDVI over the abandoned POIs.
3) Use this mean as the threshold for deciding whether a place is abandoned or not.
4) Then add a random polygon and see if this place is classified correctly.
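The steps above amount to a nearest-centroid rule: the per-class means act as the thresholds, and a new polygon gets the class whose means it sits closest to. A sketch with made-up training values (all names and numbers below are assumptions of this sketch):

```python
import numpy as np
import pandas as pd

# hand-picked POIs with known status (values are made up for illustration)
train = pd.DataFrame({
    'ndvi':    [0.42, 0.38, 0.12, 0.18],
    'avg_rad': [0.9,  1.3,  7.5,  6.1],
    'status':  ['abandoned', 'abandoned', 'occupied', 'occupied'],
})

# per-class means act as the thresholds / centroids
centroids = train.groupby('status')[['ndvi', 'avg_rad']].mean()

def classify(ndvi: float, avg_rad: float) -> str:
    """Assign the class whose centroid is nearest (squared Euclidean
    distance), a simple stand-in for the thresholding described above."""
    point = np.array([ndvi, avg_rad])
    dists = ((centroids - point) ** 2).sum(axis=1)
    return dists.idxmin()

print(classify(0.40, 1.0))
```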

This would probably work better with an ML classifier, but I can't get my head around it right now.

Useful papers

https://github.com/DominiquePaul/GDP-Satellite-Prediction

https://towardsdatascience.com/hyperspectral-image-analysis-getting-started-74758c12f2e9

https://github.com/worldbank/OpenNightLights/tree/master/onl/tutorials

https://github.com/konkyrkos/hyperspectral-image-classification

https://github.com/yohman/workshop-python-spatial-stats

https://github.com/holderbp/pwpd

https://www.frontiersin.org/articles/10.3389/fbuil.2018.00032/full

https://icaarconcrete.org/wp-content/uploads/2020/11/15ICAAR-SanchezL-2.pdf

https://www.sciencedirect.com/science/article/pii/S2666549220300013

https://www.researchgate.net/publication/268816675_Assessment_of_Concrete_Surfaces_Using_Multi-Spectral_Image_Analysis

There are a few basic boolean operations that Google Earth Engine includes as built-ins for Images. The output is a binary image that sets a pixel value to 1 if it meets the condition and 0 if it doesn't. Those operations include:

lt: "less than"
lte: "less than or equal to"
eq: "equal to"
gt: "greater than"
gte: "greater than or equal to"

The method treats the Image object it is called on as the left-hand side of the comparison, and the value passed as the input argument as the right-hand side. This input can be a scalar value that will be compared to all pixels in the image, or another image that will be used in an element-wise / pixel-wise comparison.

Based on our histogram of radiance in the sample region, it might be interesting to mask all values that are not greater than or equal to 4.
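For intuition, the same >= 4 mask can be sketched element-wise with NumPy; in Earth Engine the equivalent would combine img.gte(4) with updateMask (the radiance values below are made up):

```python
import numpy as np

radiance = np.array([[0.5, 3.9, 4.0],
                     [7.2, 1.1, 12.6]])

# analogous to ee.Image.gte(4): 1 where the pixel is >= 4, else 0
gte_mask = (radiance >= 4).astype(int)

# keep only the pixels that pass; masked-out pixels become NaN here,
# much as updateMask leaves them transparent in Earth Engine
masked = np.where(gte_mask == 1, radiance, np.nan)
print(gte_mask)
```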

Now we can see variation in radiance in a way that sheds "light" (apologies for the pun!) on activity around denser urban areas.

Later in this tutorial, we'll look at calculating the difference between two Images -- and this is another potential use for conditional operators.

The content below doesn't make much sense for now.
